Web Survey Bibliography
In surveys it is difficult to obtain valid responses on topics that respondents find threatening to answer questions about (Lee, 1993). Researchers studying such sensitive topics are more likely to be confronted with higher non-response rates and/or socially desirable answers, both of which threaten the validity of the results.
One solution to this problem is to use a Randomized-Response (RR) technique (Warner, 1965; Kuk, 1990; Chaudhuri & Mukerjee, 1988; Fox, 1986). In this technique the answer of the respondent depends in part on a randomization device (typically the throw of two dice), so that the interviewer/researcher can no longer determine what the response of an individual respondent was. Because the properties of the randomization procedure are known, valid inferences about the behavior under study can still be made.
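To illustrate how known randomization properties still permit valid inference, the following is a minimal sketch of the moment estimator for one common dice-based RR scheme (a Boruch-style forced-response design; the function name, the specific dice rules, and the example numbers are illustrative assumptions, not necessarily the design used in this study):

```python
def rr_estimate(n_yes, n, p_forced_yes, p_truthful):
    """Moment estimator of the true prevalence pi under a forced-response
    design: P(observed "yes") = p_forced_yes + p_truthful * pi."""
    lam = n_yes / n                                  # observed "yes" proportion
    pi_hat = (lam - p_forced_yes) / p_truthful       # invert the known mixing
    se = (lam * (1 - lam) / n) ** 0.5 / p_truthful   # approximate standard error
    return pi_hat, se

# Illustrative two-dice rules: sums 2-4 force "yes" (6/36), sums 11-12 force
# "no" (3/36), any other sum (27/36) means the respondent answers truthfully.
pi_hat, se = rr_estimate(n_yes=300, n=1000, p_forced_yes=6/36, p_truthful=27/36)
```

Note that the standard error is inflated by the factor 1/p_truthful relative to a direct question, which is why RR designs need larger samples for comparable precision.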
Although RR-techniques are well known and meta-analyses show that they improve the validity of the results (Edgell et al., 1982; Lensvelt-Mulders et al., 2005), they are not often used. Several reasons for this are given in the literature. First, RR-techniques require larger sample sizes to attain comparable precision. Second, researchers appear to believe, erroneously, that RR-techniques do not allow for analyses at the individual level.
The use of RR-techniques online is in its infancy. On the one hand, the anonymity of the internet is expected to increase the willingness to answer sensitive questions; on the other hand, the online implementation of RR-techniques is complicated by the fact that respondents are less likely to trust an online randomization device to guarantee their anonymity.
Design
In our study we collected survey data from about 3,250 respondents of a large Dutch Internet panel (200,000+ members): an initial wave of 1,000 respondents and a second wave consisting of 750 respondents from wave 1 plus 2,250 new respondents. They answered questions about three sets of potentially sensitive topics: traffic violations, possession of illegal software and music, and answering behavior in the online panel. Our design allows, among other things, for comparisons with respect to the following issues:
- Voluntary versus involuntary randomization: For respondents who do not mind answering the questions honestly, the randomization procedure only adds unnecessary noise. We compare direct questioning and standard randomized-response techniques with a condition in which respondents voluntarily choose the method of elicitation.
- Browser-based versus locally executable randomization: In most online applications, RR-techniques simulate the throw of the dice in the browser. Obviously, it would be easy for the researcher to monitor or manipulate the throw of the electronic dice, undermining the respondent's faith in the procedure. We compare direct questioning and standard randomized-response techniques with a condition in which the randomization device is a downloadable application.
- Determinants of compliance and test-retest reliability. Despite the procedure, respondents sometimes answer in socially desirable ways just to make sure that they do not confess to any kind of sensitive behavior. Using multiple RR-measurements, we identify which kinds of respondents are more likely to exhibit this kind of behavior. Moreover, in an initial wave we asked the same sensitive questions directly to 1,000 respondents, which allows for another way of determining compliance and test-retest reliability of the measurement procedures.
The data are being collected at the end of September, so we cannot present results at the time of submission of this abstract. The analysis of RR data has thus far been restricted to somewhat non-standard software (such as R and GLIM). Besides the issues mentioned above, our presentation will briefly elaborate on ways to use standard software to analyze RR data.
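As a hedged sketch of the kind of analysis the abstract alludes to (not the authors' actual code), a forced-response prevalence can be recovered by plain maximum likelihood with general-purpose numerical tools; the design probabilities, seed, and sample size below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)
pi_true, n = 0.20, 3250                      # assumed prevalence and sample size
p_yes, p_no, p_truth = 6/36, 3/36, 27/36     # assumed two-dice forced-response design

# Simulate: each respondent is forced to "yes", forced to "no", or truthful.
u = rng.random(n)
truthful = rng.random(n) < pi_true
answer = np.where(u < p_yes, True,
         np.where(u < p_yes + p_no, False, truthful))

def neg_loglik(pi):
    """Negative binomial log-likelihood: P(observed yes) = p_yes + p_truth * pi."""
    prob_yes = p_yes + p_truth * pi
    k = answer.sum()
    return -(k * np.log(prob_yes) + (n - k) * np.log(1 - prob_yes))

res = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(round(res.x, 3))   # MLE of the prevalence, which should land near pi_true
```

The same likelihood can be maximized in any statistics package that allows a user-defined link or log-likelihood, which is the sense in which standard software suffices.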
General Online Research (GOR) 2008 (abstract)
Web survey bibliography - Conference proceedings (83)
- Estimation and Adjustment of Self-Selection Bias in Volunteer Panel Web Surveys ; 2016; Niu, Ch.
- Shorter Interviews, Longer Surveys: Optimising the survey participant experience whilst accommodating...; 2016; Halder, A.; Bansal, H. S.; Knowles, R.; Eldridge, J.; Murray, Mi.
- Gamifying. Not all fun and games; 2016; Stubington, P.; Crichton, C.
- Are interviews costing £0.08 a waste of money? Reviewing Google Surveys for Wisdom of the Crowd...; 2016; Roughton, G.; MacKay, I.
- Observations from Twelve Years of an Annual Market Research Technology Survey; 2016; Macer, T.; Wilson, S.
- A Comparison of the Effects of Face-to-Face and Online Deliberation on Young Students’ Attitudes...; 2015; Triantafillidou, A.; Yannas, P.; Lappas, G.; Kleftodimos, A.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- Designing Bonsai Surveys: The small but perfectly formed survey experience to meet the needs of the...; 2015; Puleston, J.
- Is accuracy only for probability samples? Comparing probability and non-probability samples in a country...; 2013; Martinsson, J., Dahlberg, S., Lundmark, S.
- The effect of language in answering qualitative questions in user experience evaluation web-surveys; 2013; Walsh, T., Nurkka, P., Petrie, H., Olson, J.
- Beyond Satisfaction Questionnaires: “Hacking” the Online Survey; 2013; Evans, A. L.
- Advancing the field of questionnaire translation - identifying problems, discussing methods, pushing...; 2013; Behr, D., Dorer, B., Van Houten, G
- European Values Study - methodological and substantive applications; 2013; Luijkx, R., Jagodzinski, W.
- The Impact of Culture and Economy on Values and Attitudes; 2013; Duelmer, H., Voicu, M.
- Educational attainment in cross-national surveys: instrument design, data collection, harmonisation...; 2013; Schneider, S.
- Mode Effects in Mixed-Mode Surveys: Prevention, Diagnostics, and Adjustment 1; 2013; de Leeuw, E. D., Dillman, D. A., Schouten, B.
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Unintentional mobile respondents; 2012; Peterson, G.
- Metering mobile usage. Insights from global Arbitron mobile trends panel; 2012; Verkasalo, H.
- Is „chapterisation“ a viable alternative to traditional progress indicators ?; 2012; Spicer, R., Dowling, Z.
- Self-administered mobile surveys; 2011; Bosnjak, M.
- Online survey research: Findings, Best practices, and future research; 2011
- Blend, balance, and stabilize respondent sources; 2011; Eggers, M., Drake, E.
- Mode Effect or Question Wording? Measurement Error in Mixed Mode Surveys; 2011; de Leeuw, E. D., Hox, J., Scherpenzeel, A.
- There is an app for that! A review of smartphone apps for marketing research; 2010; Michelson, M.
- The state of online research in the U.S.; 2010; Miller, J.
- A framework for understanding and applying ethical principles in network and security research; 2010; Kenneally, E., Bailey, M., Maughan, D.
- Restructuring and innovations on the survey “capacity of collective tourist accommodation”...; 2010; Santoro, M. T., Staffieri, S.
- An Analyze of the Zero Price Effect on Online Business Performance - An Research Based on the Mobile...; 2010; Liu, Y., Yuan, P.
- Dealing with Nonresponse in Survey Sampling: an Item Response Modeling Approach; 2010; Matei, A.
- Response format effects on measurement of employment; 2009; Thomas, R. K., Dillman, D. A., Smyth, J. D.
- Response Mode and Bias Analysis in the IRS’ Individual Taxpayer Burden Survey; 2009; Brick, J. M., Contos, G., Masken, K., Nord, R.
- Survey Mode Effects in Two Military Surveys; 2009; Yang, M., Falcone, A. E., Milan, L. M.
- Web based macroseismic survey: fast information exchange and elaboration of seismic intensity effects...; 2009; De Rubeis, V., Sbarra P., Sorrentino, D., Tosi, P.
- The representativeness of the LISS panel ; 2009; Knoef, M., de Vos, K.
- Sample factors that influence data quality; 2008; Gailey, R., Teal, D., Haechrel, E.
- An online panel as a platform for multi-disciplinary research; 2008; Scherpenzeel, A.
- Visual Design Effects on Respondents Behaviour in Web-Surveys. A Design Experiment; 2008; Greinoecker, A.
- Effects of Privacy Assurances on the Online Measurement of Psychological Constructs; 2008; Witzki, A., Kramer, J.
- How Web 2.0 Technologies Can Become a Valuable Part of Online Research; 2008; Jaron, R.
- Respondent Authenticity - A biometrical approach to authenticate panelists; 2008; Wachter, B., Bender, C.
- Not Mixed-Mode but Switch-Mode; 2008; Höglinger, M., Abraham, M., Arpagaus, J.
- The Impact of Cognitive and Computer Skills on Data Quality in Computer Assisted Self Administered Questionnaires...; 2008; Brecko, B. N., Vehovar, V.
- Optimal Contact Strategy in a Mail-and-Web Mixed Mode Survey; 2008; Holmberg, A., Lorenc, B., Werner, P.
- 10 Years of Meinungsplatz.de: Success in the Collection of Data for Targeted Audiences, Such as the...; 2008; Weyergraf, O.
- Self-selection in Online Access Panels: No “Little Difference” in the Recruiting Process...; 2008; Wirth, T.
- Mobile Market Research; 2008; Maxl, E.
- Online vs. Offline in Mobile Surveys; 2008; Neubarth, W., Maier, U.
- Gender-of-Interviewer Effects in Video-Enhanced Web Surveys. Results from a Randomized Field-Experiment...; 2008; Fuchs, M.
- The Online Use of Randomized Response Measurements; 2008; Snijders, C., Weesie, J.